Lifted inference: normalizing loops by evaluation

Authors

  • Oleg Kiselyov
  • Chung-chieh Shan
Abstract

Many loops in probabilistic inference map almost every individual in their domain to the same result. Running such loops symbolically takes time sublinear in the domain size. Using normalization by evaluation with first-class delimited continuations, we lift inference procedures to reap this speed-up without interpretive overhead. To express nested loops, we use multiple control delimiters for metacircular interpretation. To express loops over a powerset domain, we convert nested loops over a subset to unnested loops.
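
Since only the abstract is reproduced here, the following is a minimal, hand-written OCaml sketch of the speed-up the abstract describes, not the paper's actual normalization-by-evaluation or delimited-continuation machinery; the type lifted_map and the function lifted_sum are invented for this illustration. A map over a large domain in which almost every individual receives the same value is kept symbolically as a default value plus a short list of exceptions, so aggregating over the whole domain costs time proportional to the number of exceptions rather than to the domain size.

    (* Symbolic representation of a loop body that maps almost every
       individual to the same value. Hypothetical names, for illustration only. *)
    type lifted_map = {
      size       : int;                 (* number of individuals in the domain *)
      default    : float;               (* value shared by almost every individual *)
      exceptions : (int * float) list;  (* the few individuals with a different value *)
    }

    (* Sum the map over the whole domain without enumerating it:
       default * (size - #exceptions) + sum of the exceptional values. *)
    let lifted_sum m =
      let n_exc = List.length m.exceptions in
      let exc_total =
        List.fold_left (fun acc (_, v) -> acc +. v) 0.0 m.exceptions in
      m.default *. float_of_int (m.size - n_exc) +. exc_total

    (* Example: one million individuals, all weighted 0.5 except two observed
       ones; the total is computed without a million iterations. *)
    let () =
      let m = { size = 1_000_000;
                default = 0.5;
                exceptions = [ (17, 0.9); (42, 0.1) ] } in
      Printf.printf "total weight = %.1f\n" (lifted_sum m)

In the paper this default-plus-exceptions view is not written by hand; it is obtained by running the inference procedure symbolically, which is why the loop can run in time sublinear in the domain size without interpretive overhead.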

Similar articles

Approximate Lifting Techniques for Belief Propagation

Many AI applications need to explicitly represent relational structure as well as handle uncertainty. First order probabilistic models combine the power of logic and probability to deal with such domains. A naive approach to inference in these models is to propositionalize the whole theory and carry out the inference on the ground network. Lifted inference techniques (such as lifted belief prop...

Tractable Learning of Liftable Markov Logic Networks

Markov logic networks (MLNs) are a popular statistical relational learning formalism that combine Markov networks with first-order logic. Unfortunately, inference and maximum-likelihood learning with MLNs is highly intractable. For inference, this problem is addressed by lifted algorithms, which speed up inference by exploiting symmetries. State-of-the-art lifted algorithms give tractability gu...

Initial Empirical Evaluation of Anytime Lifted Belief Propagation

Lifted first-order probabilistic inference, which manipulates first-order representations of graphical models directly, has been receiving increasing attention. Most lifted inference methods to date need to process the entire given model before they can provide information on a query’s answer, even if most of it is determined by a relatively small, local portion of the model. Anytime Lifted Bel...

First-order Decomposition Trees

Lifting attempts to speed up probabilistic inference by exploiting symmetries in the model. Exact lifted inference methods, like their propositional counterparts, work by recursively decomposing the model and the problem. In the propositional case, there exist formal structures, such as decomposition trees (dtrees), that represent such a decomposition and allow us to determine the complexity of ...

Aggregation and Constraint Processing in Lifted Probabilistic Inference

Representations that mix graphical models and first-order logic, called either first-order or relational probabilistic models, were proposed nearly twenty years ago and many more have since emerged. In these models, random variables are parameterized by logical variables. One way to perform inference in first-order models is to propositionalize the model, that is, to explicitly consider every elem...
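
To make the contrast with lifting concrete, here is a small OCaml sketch of propositionalization under invented names (the predicate smokes and the three-person population are made up for illustration): a random variable parameterized by a logical variable is replaced by one ground variable per individual, so the ground network grows with the population even when all of the groundings are interchangeable.

    (* Population of the logical variable X; purely illustrative. *)
    let population = [ "alice"; "bob"; "carol" ]

    (* Grounding the parameterized variable smokes(X): one propositional
       variable per individual in the population. *)
    let ground_smokes =
      List.map (fun x -> "smokes(" ^ x ^ ")") population

    let () =
      List.iter print_endline ground_smokes;
      Printf.printf "%d ground variables created\n" (List.length ground_smokes)

A lifted method avoids materializing these groundings and instead reasons about counts of interchangeable individuals.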

Publication date: 2009